1 definition found
From The Free On-line Dictionary of Computing (05 January 2017) [foldoc]:
standard deviation
(SD) A measure of the range of values in a set of
numbers. Standard deviation is a statistic used as a measure
of the dispersion or variation in a distribution, equal to the
square root of the arithmetic mean of the squares of the
deviations from the arithmetic mean.
The standard deviation of a random variable or list of numbers
(the lowercase Greek letter sigma) is the square root of the variance.
The standard deviation of the list x1, x2, x3...xn is given by
the formula:
sigma = sqrt(((x1 - avg(x))^2 + (x2 - avg(x))^2 +
... + (xn - avg(x))^2) / n)
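A minimal Python sketch of the population formula above (the
function name population_sd and the example list are illustrative,
not part of this entry):

  from math import sqrt

  def population_sd(values):
      # Population standard deviation: the square root of the mean
      # of the squared deviations from the arithmetic mean.
      n = len(values)
      mean = sum(values) / n
      return sqrt(sum((x - mean) ** 2 for x in values) / n)

  # For [2, 4, 6] the deviations from the mean 4 are -2, 0, +2,
  # so sigma = sqrt((4 + 0 + 4) / 3), roughly 1.633.
  print(population_sd([2, 4, 6]))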
The formula is used when all of the values in the population
are known. If the values x1...xn are a random sample chosen
from the population, then the sample standard deviation is
calculated with the same formula, except that (n - 1) is used as
the denominator.
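The sample version changes only the denominator. A short sketch,
again with illustrative names, compared against the Python standard
library's statistics functions:

  import statistics
  from math import sqrt

  def sample_sd(values):
      # Sample standard deviation: same formula, but divide by n - 1.
      n = len(values)
      mean = sum(values) / n
      return sqrt(sum((x - mean) ** 2 for x in values) / (n - 1))

  data = [2, 4, 6]
  print(sample_sd(data))           # 2.0
  print(statistics.stdev(data))    # 2.0    (sample standard deviation)
  print(statistics.pstdev(data))   # ~1.633 (population standard deviation)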
[dictionary.com http://dictionary.com/].
["Barrons Dictionary of Mathematical Terms, second edition"].
(2003-05-06)